Now that we have a basic understanding of raytracing and how it works within nkGraphics, it is time to create an effect that is hard to reach with rasterization. In this tutorial, we will see how to push raytracing a bit further to use different resources, and how nkGraphics allows us to do so while reusing the same API as in past tutorials !
We will use raytracing to get back to the result we had with rasterization : an environment map with a sphere reflecting it. However, we will introduce a second sphere to make things spicier ! Indeed, with default rasterization, both spheres would not easily be able to reflect each other... But this is something we get right away with raytracing !
Here is the result we will get after going through this tutorial :
Let's dig in without waiting !
Starting from the last tutorial, what we need first is a more complex shader for when rays miss the geometry. This is what will paint our background : when we don't hit anything, we simply sample the environment map we used up till now. Let's see the updated Program :
The raygen program is left untouched. However, the miss program will now sample the environment map, and thus we add the TextureCube and SamplerState declarations. It uses the provided WorldRayDirection function to retrieve the ray's direction in world space and sample from it. Note that we cannot use the Sample function in a raytracing stage : like in compute stages, we don't work with pixel quads, so DirectX cannot derive a mip level to use for a given pixel, and we have to provide one explicitly through SampleLevel.
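To fix ideas, here is a minimal sketch of what such a miss program could look like ; the resource names, registers and payload layout are illustrative, not the tutorial's exact ones :

```hlsl
// Illustrative miss program : paint the background with the environment map.
TextureCube envMap : register(t0);
SamplerState envSampler : register(s0);

struct RayPayload
{
    float4 color;   // Resulting color for the ray
    uint depth;     // Recursion depth, used later by the hit program
};

[shader("miss")]
void missMain(inout RayPayload payload)
{
    // WorldRayDirection() gives the ray direction in world space.
    // SampleLevel is mandatory here : no pixel quads means no automatic mip selection.
    payload.color = envMap.SampleLevel(envSampler, WorldRayDirection(), 0);
}
```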
Now that a new texture and a new sampler are used, we need to feed them from the Shader. Compared to what we had in the last tutorial, we only need to add :
Which is basically how we would do it for any shader, by linking a resource to its slot.
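As a reminder of what this linking can look like, here is a purely hypothetical sketch ; the setter names are guesses, and the exact nkGraphics calls are the ones used in the previous tutorials :

```cpp
// Hypothetical setters : check the Shader API from the previous tutorials for the exact names.
// The point is simply to link the cube map and the sampler to the slots declared in the miss program.
shader->addTexture(envCubeMap);   // Feeds the TextureCube slot
shader->addSampler(envSampler);   // Feeds the SamplerState slot
```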
With this, our raygen / miss shader is ready to go ! If we were to launch the program now, we would already see that the background has changed to our environment map. Now to work on the geometry itself !
As we mentioned, we want to add shiny reflections on the spheres that will use the shader. Let's update the program accordingly :
Here we are introducing quite a few new resources and data structure changes. Let's go through them in order :
This last point is quite special, as it is something we don't link to any shader slot ourselves. For a given mesh that is part of a raytraced render queue, nkGraphics will link the geometry attached to it during a hit. If you need to access such data, you need to name your slots _vertexData and _indexData respectively. The slots need to be texture slots, with no constraint on their indices, and they will be detected when the program gets compiled.
This data is accessed through StructuredBuffer declarations. Indices come as triangle indices, thus as uint3. The vertex data, however, is a little more involved to get right.
Currently, there is no way for nkGraphics to know how the meshes are laid out. This means you have to get the structures right yourself, with the right attribute types and ordering. Otherwise, you might end up sampling data incorrectly, as the offsets will be wrong. A consequence is that one hit program can only be linked to one geometry layout type, as there is no way to change how a buffer is interpreted within the same function.
As of now, this is the biggest constraint when accessing geometry data, as there is no pipeline stage feeding the data layout for us. Here, our sphere mesh provides positions, uvs, and normals, interleaved in that order, matching the structure described in the HLSL. Getting this alignment right ensures we sample the normals correctly in the function.
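For illustration, this is what such declarations could look like for the layout just described ; the struct name and register indices are arbitrary :

```hlsl
// Vertex layout matching the mesh : interleaved position / uv / normal, in that order.
struct VertexData
{
    float3 position;
    float2 uv;
    float3 normal;
};

// Slot names must be _vertexData and _indexData so nkGraphics binds the hit geometry.
// The register indices are free : they are detected when the program gets compiled.
StructuredBuffer<VertexData> _vertexData : register(t1);
StructuredBuffer<uint3> _indexData : register(t2);
```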
Speaking of the function, the way to sample geometry is to use the barycentric information of the intersection, provided by the hit stage. With it, we can interpolate between the attributes of the 3 vertices of the triangle, which we find using the PrimitiveIndex function offered to us. From there, we can reconstruct the geometry normal and the intersection position, which enables us to reflect our ray.
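A possible sketch of that interpolation, assuming the declarations above and the standard DXR intrinsics, is a small helper called from the hit program :

```hlsl
// Interpolate the vertex normals of the hit triangle from the hit attributes.
float3 interpolateNormal(BuiltInTriangleIntersectionAttributes attribs)
{
    // Indices of the triangle being hit.
    uint3 indices = _indexData[PrimitiveIndex()];

    // Rebuild the 3 barycentric weights from the 2 provided by the hit stage.
    float3 bary = float3(1.0 - attribs.barycentrics.x - attribs.barycentrics.y,
                         attribs.barycentrics.x,
                         attribs.barycentrics.y);

    float3 normal = _vertexData[indices.x].normal * bary.x
                  + _vertexData[indices.y].normal * bary.y
                  + _vertexData[indices.z].normal * bary.z;

    // Bring the normal to world space (fine for rotations and uniform scales,
    // a fully correct transform would use the inverse transpose).
    return normalize(mul((float3x3)ObjectToWorld3x4(), normal));
}
```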
Now the magic part of raytracing begins ! We can build a new ray, starting from our intersection point and going in the reflected direction. We then trace the scene again with this new ray, simply forwarding the payload so that it gets filled deeper in the recursion.
Note that we are tracking the recursion depth of a ray, and preventing a new ray from being launched if we hit a certain threshold. This is to prevent any driver crash from happening : going past the max recursion depth specified in a pass will, at best, crash the driver and application. Because DXR allocates as few resources as possible when pre-building everything, going further than that might lead it to read or write unwanted memory regions, causing device removals. We will see how we can alter this max recursion depth when working on the composition.
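Putting it together, a minimal sketch of such a hit program could look like this, reusing the interpolateNormal helper sketched above ; the acceleration structure name, registers, payload layout and depth threshold are illustrative, and the threshold has to stay consistent with the max recursion depth declared on the pass :

```hlsl
// Illustrative closest hit : reflect the ray and trace the scene again.
#define MAX_DEPTH 6

RaytracingAccelerationStructure sceneAS : register(t3);

struct RayPayload
{
    float4 color;
    uint depth;
};

[shader("closesthit")]
void closestHitMain(inout RayPayload payload, BuiltInTriangleIntersectionAttributes attribs)
{
    if (payload.depth >= MAX_DEPTH)
    {
        // Stop the recursion : going past the pass' max recursion depth can crash the driver.
        payload.color = float4(0.0, 0.0, 0.0, 1.0);
        return;
    }

    float3 normal = interpolateNormal(attribs);
    float3 hitPos = WorldRayOrigin() + RayTCurrent() * WorldRayDirection();

    RayDesc ray;
    ray.Origin = hitPos + normal * 1e-3;  // Small offset to avoid self intersection
    ray.Direction = reflect(WorldRayDirection(), normal);
    ray.TMin = 0.0;
    ray.TMax = 1e5;

    // Forward the payload so that deeper hits or the miss program fill the color in.
    payload.depth += 1;
    TraceRay(sceneAS, RAY_FLAG_NONE, 0xFF, 0, 0, 0, ray, payload);
}
```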
This concludes the big changes to the hit function. Let's see what we need to add to the Shader :
As mentioned earlier :
Now that our shaders are ready, let's see what we need to update in our composition.
As mentioned in the hit program update, we are now tracing rays recursively. This requires us to be careful about what we do in there : getting it wrong will crash the driver, as it will most probably read from unwanted memory addresses.
When setting up the RaytracingPass, it is possible to override the max recursion depth you expect to have. Here, we will work with 6 bounces, as this should be sufficient to get good fidelity in the scene. But remember the safety check we implemented in the hit function : theoretically, we could have many more bounces between the two spheres, depending on the angle of the ray. As such, we need to find a good balance between what could happen and what we actually need.
Remember that this should be kept as low as possible, for better performance.
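As a purely hypothetical sketch of that setup (the actual RaytracingPass setter may be named differently) :

```cpp
// Hypothetical call : declare the maximum TraceRay nesting the pass allows.
raytracingPass->setMaxRecursionDepth(6);
```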
After having overridden the max recursion depth, what is left is adding the second sphere to the scene :
The usual business when working with render queues and graph nodes ! With the difference, of course, that we need to provide the raytracing shader, for our RaytracingPass.
One final point that won't be detailed here : we also make this new sphere orbit around the original one, through :
This is done each frame in the custom rendering loop we wrote in the 4th tutorial. Note that there is no special trick here either : we update the node, like we would with rasterized passes. This will trigger the acceleration structure updates for us.
Beware though : the update is asynchronous. This means that if you need to move something for a static image that will get generated right after, you will need to flush rendering through Renderer::flushRendering().
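For reference, here is a hypothetical sketch of that per-frame update ; the node setter and variable names are guesses, only Renderer::flushRendering() is quoted from the paragraph above :

```cpp
// Inside the rendering loop from the 4th tutorial : orbit the second sphere.
// setPosition is a hypothetical setter, the real node API may differ.
float angle = elapsedSeconds * 0.5f;
secondSphereNode->setPosition(std::cos(angle) * 3.0f, 0.0f, std::sin(angle) * 3.0f);

// Only needed when generating a static image right after the move, as the
// acceleration structure update triggered by the node is asynchronous :
// renderer->flushRendering();
```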
We can now test the program with all the changes :
Notice how both spheres reflect each other, in real time ! This is a tricky effect to reach with pure rasterization, yet with raytracing we get it naturally by recursively tracing the scene on each intersection. It perfectly showcases why raytracing is such a big thing !
Through this tutorial, we saw how to push raytracing further and witness how powerful it can be. We also saw how nkGraphics allows us to do so without requiring many changes to what we already know from the earlier rasterization tutorials.
Now we know about :
In fact, many of these points are not really new. And now that we know about all of them, the only thing remaining is to use this new knowledge to build ever more powerful effects !